Incorporating Function Values into Quasi-Newton Updates

Author

  • Samuel R. Buss
Abstract

The traditional quasi-Newton method for updating the approximate Hessian is based on the change in the gradient of the objective function. This paper describes a new update method that also incorporates the change in the value of the function. The method effectively uses a cubic approximation of the objective function to better approximate its directional second derivative. The cubic approximation is adjusted when necessary to ensure the positive definiteness of the approximating Hessian matrix. For the BFGS method, this results in an average improvement of about 5% in performance. This improvement is modest, but can be obtained with a correspondingly modest change to the BFGS Hessian update algorithm. The best improvement is obtained with a line search algorithm that uses modified Wolfe conditions; however, in practice, the traditional line search algorithms can be used almost as effectively.
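
To make the idea concrete, the following is a minimal sketch in Python (NumPy) of a BFGS update whose secant vector is corrected using the two function values: the curvature s^T y is replaced by the second derivative, at the new point, of the cubic Hermite interpolant of f along the step. The correction formula follows from that interpolant; the function name and the fallback safeguard are illustrative, and the paper's actual adjustment may differ.

import numpy as np

def bfgs_update_with_function_values(B, s, y, f0, f1, g0, g1, eps=1e-8):
    """One BFGS update of the Hessian approximation B (sketch).

    s = x1 - x0, y = g1 - g0, with f0, f1 the function values and
    g0, g1 the gradients at x0 and x1. The usual secant vector y is
    corrected so that s^T y_t equals p''(1), the directional second
    derivative at x1 of the cubic Hermite interpolant p of f along s.
    """
    sy = s @ y
    # p''(1) = 6*(f0 - f1) + 2*g0^T s + 4*g1^T s  (from the cubic)
    cubic_curv = 6.0 * (f0 - f1) + 2.0 * (g0 @ s) + 4.0 * (g1 @ s)
    theta = cubic_curv - sy          # = 6*(f0 - f1) + 3*(g0 + g1)^T s
    y_t = y + (theta / (s @ s)) * s  # corrected secant vector
    if s @ y_t <= eps * abs(sy):     # keep the update positive definite
        y_t = y                      # fall back to the standard secant
    Bs = B @ s
    return (B - np.outer(Bs, Bs) / (s @ Bs)
              + np.outer(y_t, y_t) / (s @ y_t))

The fallback assumes s @ y > 0, which a Wolfe-condition line search guarantees.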


Similar articles

On Efficiently Computing the Eigenvalues of Limited-Memory Quasi-Newton Matrices

In this paper, we consider the problem of efficiently computing the eigenvalues of limited-memory quasi-Newton matrices that exhibit a compact formulation. In addition, we produce a compact formula for quasi-Newton matrices generated by any member of the Broyden convex class of updates. Our proposed method makes use of efficient updates to the QR factorization that substantially reduces the cos...
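
The paper's update scheme is not reproduced here, but the underlying linear algebra can be illustrated: for a compact representation B = gamma*I + Psi @ M @ Psi.T with a tall-skinny Psi, a thin QR factorization reduces the eigenvalue computation to a small k-by-k symmetric problem. A hedged sketch with illustrative names (the paper additionally maintains the QR factors under efficient updates):

import numpy as np

def compact_qn_eigenvalues(gamma, Psi, M):
    """Eigenvalues of B = gamma*I + Psi @ M @ Psi.T without forming B.

    Psi is n-by-k with k << n; M is a small symmetric k-by-k matrix.
    Since Psi @ M @ Psi.T = Q (R M R^T) Q^T for Psi = Q R, its nonzero
    spectrum is that of R @ M @ R.T; the remaining n-k eigenvalues of
    B are simply gamma.
    """
    n, k = Psi.shape
    Q, R = np.linalg.qr(Psi, mode="reduced")              # thin QR, R is k-by-k
    small = R @ M @ R.T
    shifts = np.linalg.eigvalsh((small + small.T) / 2.0)  # symmetrize for safety
    return np.concatenate([np.full(n - k, gamma), gamma + shifts])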


Self-Scaling Parallel Quasi-Newton Methods

In this paper, a new class of self-scaling quasi-Newton (SSQN) updates for solving unconstrained nonlinear optimization problems (UNOPs) is proposed. It is shown that many existing QN updates can be considered as special cases of the new family. Parallel SSQN algorithms based on this class of updates are studied. In comparison to standard serial QN methods, the proposed parallel SSQN (SSPQN) ...
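
One classical member of the self-scaling family, due to Oren and Luenberger, rescales B so that its curvature along the step matches the observed curvature before applying the ordinary BFGS update. A minimal sketch for orientation (the class proposed in the paper is broader, and the parallel algorithms are not shown):

import numpy as np

def self_scaling_bfgs_update(B, s, y):
    """Self-scaling BFGS: rescale B by tau = s^T y / s^T B s, then update.

    Assumes s @ y > 0, as guaranteed by a Wolfe line search.
    """
    Bs = B @ s
    tau = (s @ y) / (s @ Bs)      # scaling factor
    B, Bs = tau * B, tau * Bs     # after scaling, s^T B s = s^T y
    return (B - np.outer(Bs, Bs) / (s @ Bs)
              + np.outer(y, y) / (s @ y))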


A combined class of self-scaling and modified quasi-Newton methods

Techniques for obtaining safely positive definite Hessian approximations with self-scaling and modified quasi-Newton updates are combined to obtain 'better' curvature approximations in line search methods for unconstrained optimization. It is shown that this class of methods, like the BFGS method, has global and superlinear convergence for convex functions. Numerical experiments with this class, ...
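
Since this entry combines the two ingredients sketched above, one hedged guess at the structure is a self-scaling factor applied together with a modified secant vector whose correction term theta is clipped so the resulting curvature stays safely positive, hence a safely positive definite update. Illustrative only; the paper's actual class may differ.

import numpy as np

def combined_ssqn_update(B, s, y, theta, c=0.01):
    """Self-scaling BFGS with a modified, safeguarded secant vector.

    theta is a curvature correction (e.g., from function values);
    it is clipped so that s^T y_t >= c * s^T y > 0.
    """
    sy = s @ y
    theta = max(theta, (c - 1.0) * sy)   # safeguard: s^T y_t stays positive
    y_t = y + (theta / (s @ s)) * s      # modified secant vector
    Bs = B @ s
    tau = (s @ y_t) / (s @ Bs)           # self-scaling factor
    B, Bs = tau * B, tau * Bs
    return (B - np.outer(Bs, Bs) / (s @ Bs)
              + np.outer(y_t, y_t) / (s @ y_t))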


Quasi-Newton updates with weighted secant equations

We provide a formula for variational quasi-Newton updates with multiple weighted secant equations. The derivation of the formula leads to a Sylvester equation in the correction matrix. Examples are given.
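
The abstract only states that the derivation leads to a Sylvester equation in the correction matrix; as a hedged illustration, equations of that shape, A E + E B = C, can be solved directly with SciPy. The coefficient matrices below are random placeholders, not the paper's weighted secant data.

import numpy as np
from scipy.linalg import solve_sylvester

rng = np.random.default_rng(0)
n = 6
A = rng.standard_normal((n, n)) + n * np.eye(n)  # shift keeps A and -B spectrally disjoint
B = rng.standard_normal((n, n))
C = rng.standard_normal((n, n))

E = solve_sylvester(A, B, C)                     # solves A E + E B = C
print(np.linalg.norm(A @ E + E @ B - C))         # residual check, ~1e-13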


Low-rank Quasi-Newton Updates for Robust Jacobian Lagging in Newton Methods

Newton-Krylov methods are standard tools for solving nonlinear problems. A common approach is to “lag” the Jacobian when assembly or preconditioner setup is computationally expensive, in exchange for some degradation in the convergence rate and robustness. We show that this degradation may be partially mitigated by using the lagged Jacobian as an initial operator in a quasi-Newton method, which...
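
A minimal dense sketch of the mechanism, assuming Broyden's "good" rank-one update as the quasi-Newton correction (illustrative: the paper works in a Newton-Krylov setting with matrix-free operators and preconditioning, and its update may differ):

import numpy as np

def broyden_update(J, dx, dr):
    """Broyden's rank-one update: the smallest correction (in Frobenius
    norm) that makes J satisfy the secant condition J dx = dr."""
    return J + np.outer(dr - J @ dx, dx) / (dx @ dx)

def lagged_newton(F, J_lagged, x, tol=1e-10, max_iter=50):
    """Newton iteration with a 'lagged' Jacobian, assembled once and then
    repaired by low-rank quasi-Newton updates instead of re-assembled."""
    J = J_lagged.copy()                 # initial operator = lagged Jacobian
    r = F(x)
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        dx = np.linalg.solve(J, -r)     # a Krylov solve in practice
        x_new = x + dx
        r_new = F(x_new)
        J = broyden_update(J, dx, r_new - r)  # low-rank repair of the lag
        x, r = x_new, r_new
    return x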




Journal:

Volume:   Issue:

Pages:

Publication date: 2006